Chapter 2 The Ur Nerd of Capitalism

The Value Proposition

In the wee hours of the night, two young men are writing code, their faces pale in the blue light of the room. They haven’t taken a break in hours. One of them—his sandy brown hair disheveled, his oversized glasses flecked with grease—pours Tang from the jar directly into his palm and licks the orange powder. The sugar hits his bloodstream. He can keep going. There is a deadline looming and not a moment to be lost. Earlier, he had told a company that he had exactly the software they needed. The reality was a little different, but the duo eventually delivered. Out of that first piece of code they wrote in 1975 was born Microsoft.1

We love origin stories of tech wunderkinds like Bill Gates and Paul Allen, the founders of Microsoft. Tales of sleep-starved young men (almost always men) building the companies of their dreams in dorm rooms and rented garages—fueled by little other than caffeine, drugs, sugar, and an unwavering determination to change the world, and becoming fantastically rich in the process—feed the narrative of American innovation and individual success. In recent decades, with successful entrepreneurs in Silicon Valley and elsewhere producing unicorns and decacorns by the dozen, the "nerd founder" has reached folkloric status. Nerds, with their unruly hair, social ineptitude, and uncanny brilliance, are objects of wonder, parody, envy, and even emulation. They inspire both popular culture and academic study. Nerd brains and the fruits they bear have become American capitalism's greatest asset, creating immense value for society and cementing the country's top spot in technology.

“The lonely-nerd-turned-accidental-billionaire narrative has assumed the mantle of the Great American Success Story,” the historian Nathan Ensmenger writes in “Beards, Sandals, and Other Signs of Rugged Individualism,” his 2015 paper about how computer professionals of the 1960s and 1970s built a masculine identity around programming. The computer nerd, Ensmenger writes, thus became a “stock character in the repertoire of American popular culture, his defining characteristics (white, male, middle-class, uncomfortable in his body and awkward around women) well established.”2

There is perhaps no nerd more representative of the early coalescence of technology, popular culture, and capitalism than Gates, the one with the greasy glasses. Like the best of the early tech savants, the Microsoft cofounder had serious programming credentials and a deep understanding of technology. But arguably, his biggest victory lay in his business vision. Gates saw a commercial market for software where none existed and built one of the world's biggest companies based on that vision. A year after Microsoft went public in 1986, Gates became America's youngest billionaire and the first to make his fortune from technology. He was 31 years old.

He was also an awkward young man who could be imperious and intolerant of others. He chewed so often and so furiously on the stems of his glasses that the plastic ends frayed. The rhythm with which he rocked back and forth in his chair was a barometer of his engagement with a topic. He displayed bouts of frightening intensity and passion, but he was physically unassertive. Software was his primary language. In story after news story and book after book, writers made a point of mentioning his appearance and behavior in the same breath as Microsoft’s latest software. Even the occasional frosting of dandruff on his shoulders became a subject of private discussion among tech reporters. Gates still exhibits some of those tics. He slouches when he stands, slumps when he sits, often gesticulates wildly with his hands when he is animated or tucks them under his armpits when in listening mode. Sometimes, when making a point, his arms stretch as wide as the wings of an osprey in midflight. His feet tap in time to the pace of his speech. He studs his sentences with words like “neat” and “cool.” Gates once called his rocking and swaying body movements a “metronome” for his brain.3 Social niceties and small talk meant to lubricate the start of a conversation are lost on him. Repartee isn’t his forte, although people close to him say he has an offbeat charm that comes through in small settings; Buffett told this reporter that Gates has a “keen sense of humor.” Still, it can be excruciating to watch him work the room at a cocktail party, say, in Davos, or at dinner after a conference. As Ken Auletta wrote memorably in World War 3.0, his detailed book about Microsoft’s battle with the government over its monopolistic practices in the late 1990s, a conversation with Gates was “business sex, without the foreplay.”4

In the 1980s, as Microsoft became more dominant, and Gates more visible, he bestowed mystique and prestige on nerdiness in equal measure, legitimizing both a personality type and the popular image of one. Gates was hardly the only technological genius of his era who shared his physical attributes and his ambitions. But he occupied a unique perch in the nerd pantheon because his nerdiness and his business instincts were so in balance that they appeared to be the core formula for Microsoft’s success. His persona was as essential, it seemed, as Microsoft was unavoidable. He—the tech nerd and the businessman—was capitalism’s darling, a blueprint not only for investors hoping to unearth the next badly dressed boy billionaire, but also for endless popular culture portrayals.

Bob Muglia, a longtime technology executive who joined Microsoft in 1988 and worked there for 23 years, said that Gates set the tone for a lot of what we classify as nerd behavior, from the glasses to the arrogance. “He was the perfect persona.” In his long career in Seattle, and later in Silicon Valley as a top executive at Juniper Networks and chief executive of Snowflake Computing, Muglia, also a billionaire, observed that nerds are defined as much by their unpredictable behavior as by their physical appearance. “You hear things from them that you don’t expect. It’s not an uncommon characteristic, because they’re always thinking about random things,” he said. “They’re all on the spectrum,” he added, calling them “unique individuals” who have specific characteristics, including a taste for risk-taking, “which can get them into trouble in a number of ways, and the brilliance and the confidence that comes from that brilliance. They have a certain bravado that others don’t.”

When Gates entered Harvard University in 1973, the first cheap desktop computers were coming to market, made possible by advances in chip technology. But they needed software to become functional. As the Microsoft legend goes, Allen happened to read in Popular Electronics that the makers of a microcomputer called the MITS Altair were looking for a programming language that could run on their hardware. The Altair was a foot and a half tall, and nearly as wide.

After Allen told Gates excitedly about the opportunity, they decided to make a pitch—without having written a single line of code. Gates then called Ed Roberts at Altair from his dorm at Currier House at Harvard, pretending to be Allen. They had decided that Gates would do the talking, but Allen, the older of the two, would be the one to go for the in-person meeting in case their ploy worked. “We’ve got a BASIC for the Altair that’s just about finished, and we’d like to come out and show it to you,” Gates said to Roberts.5 Allen would later recount that episode in his memoir: “I admired Bill’s bravado but worried that he’d gone too far, since we’d yet to write the first line of code.”

But Altair executives bought their pitch, and Gates and Allen coded, as day turned into night, hurriedly adapting an existing software program to deliver the product they had promised. Micro-Soft was born in the fall of 1975 (they dropped the hyphen some years later). Originally based in Albuquerque, New Mexico, the company relocated to the Seattle area in 1979. From the start, their vision was that there would be a computer on every desk and in every home, and that the software running those machines should be paid for. The idea that software is worth paying for might seem like a no-brainer today. But Gates and Allen were trying to build a business around the invisible stuff that makes computers work when all the excitement was around the design and development of hardware. Software was largely something that math and engineering students and hobbyists tinkered with on the side and shared freely, despite growing demand for operating systems from computer manufacturers like IBM. It was little surprise that the hobbyists, many of whom were reared in the 1960s counterculture movement that eschewed commercialism, irritated the young Gates. Their approach interfered with his business plan.

In the February 1976 issue of Computer Notes magazine, Gates wrote an open letter to computer hobbyists, lamenting the software piracy and lack of principles in their community. “As the majority of hobbyists must be aware, most of you steal your software,” Gates wrote in the brief but well-worded missive. “Hardware must be paid for, but software is something to share. Who cares if the people who worked on it get paid?” Gates wrote with a rhetorical flourish, displaying his fundamental belief that software was intellectual property that must be protected.6 By insisting that software was proprietary, Gates scrubbed off its countercultural ethos and imposed upon it the capitalist values that he had embraced from the start. A big helping of luck would soon supercharge his vision.

Once Microsoft gained some success with its programming language, it struck a deal with IBM to build an operating system for the hardware giant's personal computer, which would come to market in 1981. Gates was able to secure an introduction to IBM because his mother, Mary Maxwell Gates, served on a committee of the nonprofit United Way at the same time as IBM's then chairman John Opel, and happened to mention her son's young company. Opel then mentioned Microsoft to some other IBM executives, the story goes, and they decided to invite Microsoft to build a software program for the new machine.7 Microsoft bought a disk operating system from Seattle Computer Products for around $50,000 and licensed it to IBM as MS-DOS. Microsoft also got IBM to agree to allow it to license MS-DOS to other hardware makers, thereby paving the way for it to sell software to every computer manufacturer in the market. The licensing model was Gates's real innovation; over the decades it brought Microsoft hundreds of billions of dollars in revenue. It was the equivalent of a person negotiating an open relationship just as the field blossoms with potential partners. Not only were there dozens of computer manufacturers when Microsoft secured its licensing agreement, but advances in chip technology were already bringing down the average cost of a PC, making it affordable to households. And Big Blue's PC was becoming the computer of choice in businesses and homes. Together, these events formed the bedrock upon which Microsoft built its empire.

Driven by a hard-nosed capitalism, Gates sought ways to build a business around Microsoft's core product, seeking competitive advantages where he could and defending his company's turf against potential threats. If that meant taking an existing program or application, tweaking it, and selling it as part of Microsoft's flagship Windows software, so be it. The public spats with rivals, the ugly accusations of plagiarism and theft, the labeling of Microsoft products as shoddy, and the dismissals of his programming credentials—even if they upset him—were simply the cost of doing business. His ruthlessness also extended to personal relationships; in his 2011 memoir Idea Man, Allen, who died in 2018, said that Gates had tried to dilute his stake in Microsoft and even tried to buy out his cofounder with a lowball offer in 1983. By then, Allen had been diagnosed with Hodgkin's lymphoma and begun working less, and Gates argued that Allen should therefore have a smaller stake.

At the same time, Steve Jobs—who, like Gates, could be mercurial and harsh, but who did not share the same nerdy traits—was building his reputation as an exacting design maven with a far more beguiling vision of what the personal computer could be. Apple's computers were elegant, easy-to-use machines over which Jobs exercised full control, from the design specifics to the presentation. Gates and Jobs were early collaborators on software but also fierce rivals, each disdainful of the other. Jobs designed products that created desire—Larry Ellison, the cofounder of Oracle, once called Apple the only "lifestyle brand in the computer industry"—whereas Microsoft software, by comparison, was dull but dominant.8

In his biography of Jobs, Walter Isaacson compares the era of Jobs and Gates to the relationship between Albert Einstein and Niels Bohr in twentieth-century physics, or between Thomas Jefferson and Alexander Hamilton in early American governance. Those eras came to be "shaped by the relationship and rivalry of two orbiting superstars," Isaacson writes. "For the first thirty years of the personal computer age, beginning in the late 1970s, the defining binary star system was composed of two high-energy college dropouts both born in 1955. Jobs was more intuitive and romantic and had a greater instinct for making technology usable, design delightful, and interfaces friendly."9 Jobs made no secret of his utter disgust for what he viewed as Microsoft's shoddy products, with their lack of elegance and taste. "Bill is basically unimaginative and has never invented anything, which is why I think he's more comfortable now in philanthropy than technology," he told Isaacson. "He just shamelessly ripped off other people's ideas."

Who Is a Nerd?

The technology nerd did not emerge fully formed, like Aphrodite, out of a sea of computer cables. Rather, he is an artificial construct, Frankenstein's monster born of necessity, stereotype, and early mythmaking in the tech industry. The etymology of the term "nerd" is somewhat obscure. Some point to a Dr. Seuss book in verse from 1950, in which a creature called a "nerd" appears, while others say it is derived from the word "nut," meaning a crazy person. But the term itself didn't become part of the everyday lexicon until the early days of the computing revolution, when "nerd" came to stand for an eccentric male who preferred coding and hacking to human interaction, and often held antiauthoritarian beliefs popular with the 1960s counterculture movement.

Ensmenger, the computer historian, has thought a lot about nerds. Genial, thoughtful, and almost boyish in his appearance, Ensmenger studied civil engineering at Princeton before becoming interested in the sociocultural aspects of computing. Now an associate professor at Indiana University, he focuses his research on how masculine culture works within the computing industry, and how early stereotypes were formed. In particular, he was struck by the fact that when mainframe computers were the norm in the 1950s, many early programmers were women. Yet the foundations of the modern computer era were laid in a hypermasculine environment of hobbyists dominated by the perpetually adolescent "whiz kid."

In The Computer Boys Take Over, Ensmenger traces the rather arbitrary arc of that evolution. In the early 1960s, an IBM mainframe computer cost millions of dollars, required roughly 10,000 square feet of air-conditioned space, and needed dozens of programmers to feed it operating instructions.10 Ensmenger underscores that there were no preconceived notions of who could be a programmer, and there was even an initial conflict over whether programming should be cast as an art or a science. A recruitment ad by IBM from 1969, for instance, was quite open-ended about who could be a programmer—a music composer; a lover of geometry; anyone with an orderly mind who enjoys anagrams, chess, or bridge; or at least a person with a "lively imagination."

But as minicomputers moved from the worlds of science and academia to the broader market, their potential for use in business applications set off a mini manufacturing boom. And as computing technology evolved, the machines themselves began to shrink, making them viable for a far wider market. The need for computer programmers increased, and with it came the need for a more systematic way to recruit them. "It was clear that recruiting programmers a half dozen at a time with cute advertisements in The New York Times was not a sustainable strategy," Ensmenger writes.11 From that desperate demand for people who could code emerged a set of qualities—often determined by aptitude tests, psychological profiles, and, generally, "deeply flawed scientific methodology"—that fed into the idea of the perfect programmer. "The primary selection mechanism used by the industry selected for antisocial, mathematically inclined males, and therefore antisocial, mathematically inclined males were overrepresented in the programmer population; this in turn reinforced the popular perception that programmers ought to be antisocial and mathematically inclined (and therefore male), and so on ad infinitum." This demand-driven, unplanned, and almost whimsical way of finding computer programmers would ultimately feed into the definition of the nerd. Casting about for solitary males to fill programming seats was one way to spread the nerd gospel. Equally, though, a few landmark pieces of cultural documentation helped flesh out both the characteristics of the nerd and the environment in which he thrived most.

In 1972, the writer Stewart Brand wrote a piece for Rolling Stone called "Spacewar: Fanatic Life and Symbolic Death Among the Computer Bums."12 Brand, who by then had achieved some fame as the publisher of the Whole Earth Catalog, immersed himself in the lives of early coders and programmers like an ethnologist, observing their habits and behaviors and capturing the culture of "hackers," as they were known at the time. A rabbi for the Bay Area counterculture movement in the 1960s, Brand followed his subjects as they played a computer game called Spacewar that had been created at the Massachusetts Institute of Technology. In the article, one of Brand's subjects described the attributes of a standard computer bum: "A true hacker is not a group person. He's a person who loves to stay up all night, he and the machine in a love-hate relationship… They're kids who tended to be brilliant but not very interested in conventional goals. And computing is just a fabulous place for that, because it's a place where you don't have to be a PhD or anything else. It's a place where you can still be an artisan. People are willing to pay you if you're any good at all, and you have plenty of time for screwing around." The interviewee, Alan Kay of the Xerox Palo Alto Research Center, called the term "hacker" "a term of derision and… the ultimate compliment."

Who is Brand interviewing if not a self-described nerd? The term "hacker" has a very different connotation today than it did when Brand wrote his iconic piece. Back then, it referred to a group of young men who braided their belief in the transcendent power of computing with aspects of the counterculture that rejected centralized authority—the very same people who so irritated Gates because they believed that software should be free, not proprietary.

Brand, in his robust and engaging prose, went further. Hackers were “a mobile new-found elite, with its own apparat, language, and character, its own legends and humor,” he wrote. “Those magnificent men with their flying machines, scouting a leading edge of technology which has an odd softness to it; outlaw country, where rules are not decree or routine so much as the starker demands of what’s possible.”

Within a decade of Brand’s piece, with the personal computing revolution at their doorstep, the anticommercial ethos of the hobbyists, the computer bums, and the hackers had been tamed by the staid ambitions of capitalism. The hacker was dragged from the frontier to the mainstream. At the same time, the conditions deemed essential to their success—the solitary environment free of traditional management rules, the competitiveness, the idea of work as its own reward—were kept intact.

In The Soul of a New Machine, published in 1981, the author Tracy Kidder described the race between two computer hardware companies, Data General and Digital Equipment Corporation, to build a new minicomputer. Kidder, who won a Pulitzer Prize for the book, not only takes the reader into the deepest reaches of the computer itself, but also places us amid the dozens of young, sleep-deprived, and sometimes self-taught engineers who work on their projects without a break, only the shadows on their chins marking the passing of time. We are invited into the drama of the competition, but equally, we are compelled by the creation of a new culture and mythology—of those who would come to be known as nerds.

Nerds 1.0 and 2.0

The 1980s were the go-go years for Wall Street. Hostile takeovers were new and thrilling, and celebrity dealmakers had cultural capital. It was a place driven by excessive greed, outsized bets, and financial scandal, and it was immortalized in books like Den of Thieves, The Predators’ Ball, and Barbarians at the Gate. Ivan Boesky and Michael Milken, for a time, became synonymous with financial crime. The takeover of RJR Nabisco established the private equity business as an untamed, acquisitive new force with the potential for economic harm rather than good.

Literature and popular culture provided the harmony to the main tune of Wall Street in that decade. In The Bonfire of the Vanities, a bombastic tale of race, class, and politics, Tom Wolfe portrayed New York as a city seething with ambition, greed, and ego—the Masters of the Universe! But nothing established Wall Street as a cesspit of amoral, selfish, and scheming individuals who will stop at nothing to make their payday more than the 1987 movie Wall Street, which made Gordon Gekko a fixture in the public imagination—in particular, the line that “greed is good.” Gekko, played by Michael Douglas, may have been making a more nuanced point about the nature of capitalism, but the line became a damning shorthand for Wall Street.

At the same time, a crop of young technology entrepreneurs was swiftly climbing up the ladder of influence and wealth. In addition to Gates and Jobs, there was Michael Dell, a college dropout who founded Dell Computer in 1984, rethinking the way that computers could be built and sold. Ellison, another dropout, cofounded Oracle in 1977, creating a commercially viable database software product to help businesses manage and store their data. Sun Microsystems, cofounded by Scott McNealy in 1982, changed the way computers talked to each other via networking software. The company remained independent until Oracle bought it in 2010.

Microsoft, Oracle, and Sun all went public in the same week in March 1986, but the “IPO of the year” title went to Microsoft, given the sheer excitement around the rapidly growing software company. In 1990, Microsoft became the first software company to cross $1 billion in sales. Five years later, in 1995, Gates became the richest man in America, a title he would hold for the better part of two decades. It had been twenty years since Gates cofounded Microsoft, and the company, now dominant and seemingly invincible, was long past its scrappy days. Gates, too, had lost some of his nerdy entrepreneurial sheen.

By the mid 1990s, America was in the throes of a hot and heavy love affair with the second iteration of the nerds: the drivers of the digital revolution. The sociologist Thomas Streeter has posited that part of the reason why the internet gained romantic status when it did was that Microsoft’s dominance “represented the uninspiring end of the garage start-up days in microcomputing,” motivating many technology-oriented students and young entrepreneurs to study the commercial potential of newer technologies. “Not only had the desktop computer become a commonplace of office life, but the companies that made microcomputers no longer seemed like the boisterous garage start-ups of popular capitalist mythology,” Streeter writes. “By 1990, the least glamorous of the 1980s microcomputer companies, Microsoft, had achieved that much-prized and much-hated state common to technology industries: a practical monopoly. The gray, arrogant, predictable monopoly of IBM had been overthrown and replaced—by another gray, arrogant, predictable monopoly.”13

At the same time, it was not lost on the new generation of would-be entrepreneurs or Wall Street investors that Microsoft’s unfathomable success had generated massive wealth. An investor who bought one share of Microsoft when it went public on March 13, 1986, would get a return of more than 8,500 percent a decade later. Or if you bought $1,000 worth of Microsoft stock at its IPO price of $21 per share, you would be sitting on more than $4 million today—enough to pay off your student debt, buy a loft apartment in downtown Manhattan, or retire comfortably in a mid-priced city.

The combination of new technologies, investor enthusiasm, and media excitement—not to mention, the implicit search for the next Bill Gates—created the perfect conditions for the dotcom boom. Hundreds of graduates of business schools and other top universities who might have chosen to go to Wall Street instead made their way to Silicon Valley. Investment bankers rushed to cater to the financial needs of fledgling companies, and the number of venture capital investors swelled. The breathless rush for deals meant that companies with poorly formed ideas and no profits were suddenly tapping millions of dollars. In 1993, Wired magazine launched. Dedicated to chronicling emerging technologies and their impact on culture and society, it quickly became a media touchstone. According to one statistic, during the year 1996, a Valley company went public every five days, creating millionaires overnight.14

The biggest beneficiary of that excitement was Mosaic, a browser programmed by Marc Andreessen, then an undergraduate, and Eric Bina, a staff programmer, at the National Center for Supercomputing Applications at the University of Illinois. "Surfing the web using Mosaic in the early days shared certain features with the early stages of a romantic affair, or the first phases of a revolutionary movement: the dreamlike experience of pointing, clicking, and watching images slowly appear generated a sense of anticipation, of possibility. Mosaic was not a case of desire satisfied, but of desire provoked," Streeter writes. In April 1994, Jim Clark, a professor turned businessman who had found success with the hardware company Silicon Graphics, founded Netscape along with Andreessen. The two launched the Netscape Navigator browser six months later. Netscape's public offering in 1995 made front-page news. The New York Times wrote: "A 15-month-old company that has never made a dime of profit had one of the most stunning debuts in Wall Street history yesterday as investors rushed to pour their money into cyberspace."15 It had $100 million in revenue and no profit.16

Andreessen, at six feet four inches tall, loomed large both physically and metaphorically as tech's newest wunderkind, eliciting regular comparisons to Gates. Aspects of his persona became part of Netscape's storytelling, just as Gates's tics had once been studied and tied to his business maneuvers. People recount Andreessen showing up to meetings barefoot and in shorts, and proceeding to eat a messy Subway sandwich while discussing the intricacies of the browser.

Rosanne Siino, an experienced technology marketing executive who was one of the original team members of Netscape, saw the potential for a story centered around a 19-year-old founder who “doesn’t know how to wear a clean T-shirt and eats sandwiches messily.” Siino also pitched Andreessen as part of a new cadre of entrepreneurs different from Gates, who flew solo, and built the father–son dynamic between Clark and Andreessen into the company’s creation myth. Once a perception, a brand, or a story is created, it no longer matters who the actual person is, Siino said. Details of Andreessen’s “regular guy” life filtered through to the press. He appeared on the February 19, 1996, cover of Time magazine, barefooted, in jeans and sitting on a gilded throne. “The Golden Geeks,” the magazine called Andreessen and a crop of other emerging tech founders. “Who are they? How do they live? And what do they mean for America’s future?” At least one writer suggested that the “kid” could topple Gates.17

Andreessen, now a venture capitalist and arch defender of nerds, has dwelled on the type frequently, whether in interviews or in missives known as "tweetstorms" posted to his 1.2 million followers on X, the platform formerly known as Twitter. In a 2022 interview, he pointed out the staying power of nerds: A lot of people in finance with MBAs joined the tech industry in the 1990s only to leave as soon as the dotcom bubble burst, and then returned to the industry after realizing there was much more potential. "The Harvard MBAs left for a while, at least until after the new hills were discovered—smartphones, social networking, Web 2.0, cloud [computing]. Then those people came back into tech," he said. "That's why the nerds are predictive and the MBAs aren't."18 In 2014, he wrote on what was then Twitter: "Silicon Valley is nerd culture, and we are the bro's natural enemy."

The bursting of the dotcom bubble was a mere blip in the long march of tech's domination, because the next generation of tech behemoths led by nerdy founders was already sprouting. In 1998, Larry Page and Sergey Brin started Google out of a garage; Jeff Bezos had quit his hedge fund job a few years earlier to start Amazon. Also in 1998, Peter Thiel and a few others cofounded PayPal, letting people make hassle-free money transfers online. In 2004, Mark Zuckerberg started Facebook out of his college dorm room. Apple introduced the iPhone in 2007, the same year Netflix, whose founders Reed Hastings and Marc Randolph had pioneered the concept of renting DVDs by mail a decade earlier, introduced its streaming service. Wireless, GPS, and Bluetooth technologies became widely available for commercial use.

When the global economy entered a recession following the 2008 financial crisis, which once again highlighted the greed and recklessness of Wall Street, young technology companies were the few specks of light amid the gloom. Since then, the technology sector has grown at warp speed, destroying old businesses and creating new ones, pushing the boundaries of what is possible, reaping billions of dollars of profits, making founders and investors obscenely rich, and consolidating America’s position as the world’s technology leader. Today, the tech sector accounts for more than a quarter of the S&P 500 stock index. Analysts even anointed Apple, Alphabet, Amazon, Meta, Microsoft, Nvidia, and Tesla the “Magnificent Seven”—a reference to the 1960 Western remake of Akira Kurosawa’s Seven Samurai, in which seven gun-toting mercenaries team up to protect a village from bandits—because of their dominance and power. At $2.6 trillion, the digital economy, which typically includes software, services, and computing, made up just over 10 percent of the total gross domestic product of the United States in 2022, up from about 2 percent at the turn of the century, according to U.S. government data. For years, the sector has grown at a faster pace than U.S. GDP.19

The median annual wage for the computer and information technology industry was $100,532 as of May 2022, more than double the median annual wage for all other occupations.20 In the next decade, computer and information technology jobs are expected to grow 15 percent, much faster than the average rate for other jobs. Although technology has killed jobs, it has also added new ones, and net employment in the sector has continued to grow. Venture capital firms have seen gushers of cash from traditional money managers hoping to get a piece of the next big thing. In 2011, as the economy found its footing following the recession, venture capitalists invested $262 billion in roughly 8,000 companies. By 2022, they invested four times as much money into more than 27,000 companies.21 The first billion-dollar start-up was christened a "unicorn" because it was so rare. Unicorns then became so commonplace—there were more than 700 at the end of 2023—that the term "decacorn" had to be invented. At last count, 74 of the 400 richest Americans had made their billions from industries that can loosely be categorized as technology. Eight of the top 10 billionaires in the Forbes 2023 list of the world's richest people were tech billionaires—Musk, Bezos, Ellison, Page, Gates, Brin, Zuckerberg, and Steve Ballmer. Each had an estimated net worth of more than $100 billion. Their collective net worth of about $1.2 trillion exceeded the 2021 GDP of the Netherlands. If Bezos were a country, his net worth of $201 billion would place him fifty-fourth on the list of countries ranked by gross domestic product, just below Iraq but ahead of Ukraine. The divorces of tech billionaires have also created new and enormous fortunes, including those of MacKenzie Scott, formerly married to Bezos, and Melinda French Gates. Tech fortunes have eclipsed many of the biggest on Wall Street, including those of the founders of hedge funds and private equity firms. And tech billionaires have accumulated their wealth far more swiftly than billionaires in other industries.

The financial dominance and overwhelming importance of the technology sector has conferred upon entrepreneurs and innovators nearly unrestrained power over society, culture, and the public imagination, not to mention our daily lives, feeding into preexisting biases about who is best positioned to lead us into the future. In no other historical period has the pace and breadth of technological change been as uncontained, even magical, as in the past decade and a half. We are able to shop for anything online and have it delivered in minutes, or connect with friends from anywhere in the world using a six-inch device held in the palm of a hand. We can type a query into a white search bar and find instantaneous results—so much so that "Google" is now a transitive verb. A chatbot employing artificial intelligence can hold a conversation with a human, or summarize billions of pages of data in seconds. Our data is stored virtually in the cloud, allowing us to access our email and our photographs from anywhere. Cars can now plug into electric sockets to recharge their batteries. Navigation, which once required paper maps, is now an application on a smartphone. Technology is so essential to the way we go about our lives that we no longer stop to think about it. Novelists might imagine futures, but technologists bring them to life.

Myths of the Nerd Ecosystem

"There are elements of truth in all mythology, along with a good dose of exaggeration that I have not contributed to," Gates once told Playboy magazine, speaking about himself at the height of Microsoft's reign in the 1990s.22 Myths are sustained by our beliefs and our storytelling. They stretch reality to make the inexplicable explicable, and the irrational rational. Stereotypes, on the other hand, are sustained by our ignorance. They are lazy associations we make, signifiers with only a casual relation to the truth. In the twentieth and twenty-first centuries, perhaps no other industry has seen the cohabitation of sociocultural mythmaking and stereotyping as much as the technology industry, a combination of hype, hysteria, and hagiography. It is hard not to mythologize the individuals whose creations have transformed the ways in which we live and think at bewildering speed, and who even appear to predict the future. Since the Industrial Revolution, inventors of technology, from Thomas Alva Edison to Henry Ford, have been seen as exceptional, according to Ensmenger, the historian. Early biographies of Edison portrayed the inventor of the light bulb in terms of the metaphysical—someone who brought "light to darkness," he said.

Gates may be known more for his philanthropy in the past couple of decades than for his life as a technology executive, but the Microsoft cofounder helped build some of the stickiest myths about the environments in which technological greatness thrives. Take the myth that tech founders are young; often they are also college dropouts. Because so many successful technology companies have been started by wet-behind-the-ears twentysomethings, the young are seen as intuitively understanding that which is about to happen and that which is yet to come, while the skill of their venture capital backers (many of whom were once young founders) lies in betting on the right youngsters. That kind of you-don't-get-it attitude, as Streeter, the sociologist, describes it, creates an environment in which outsiders and even older entrepreneurs might be wary of criticizing or questioning what's in front of them. "Express doubts, and you risk being worse than wrong, you risk revealing yourself to be a dinosaur and thus no longer part of the privileged club; you just don't get it."23

The truth can be more nuanced. In a 2018 academic study, researchers found that successful tech entrepreneurs, on average, are in their forties. Their findings, based on tax filings, census data, and other federal data, help to show how certain images—the young white male, in this case—capture the popular imagination even when a closer inspection of the subject might indicate otherwise.24 The researchers also found that only one-fifth of all billionaires are dropouts.

A second myth—so persistent as to be tired—is that of the garage, the dorm room, the basement, and increasingly, start-up accelerators like Y Combinator, as the starting point for world-changing innovation, mainly because some of the world’s most successful companies, including Microsoft, Facebook, Apple, Google, and Amazon, got their start in one of those locations. The garage where Hewlett-Packard was founded is such a point of interest that it was christened the birthplace of Silicon Valley in 1989 and added to the National Register of Historic Places in 2007. It doesn’t matter that these locations were favored because they were mostly rent-free or convenient, or that there are millions of technology start-ups that were founded in more welcoming spots. As entry points for someone’s unimaginable success, garages, basements, and dorm rooms have also become entry points for our collective storytelling. The garage-to-billionaire stories of tech founders also map nicely onto the rags-to-riches storyline of the American dream.

Another of our most abiding myths, which feeds into the broader American narrative of the self-made individual, is that the founders of tech companies willed their creations into being simply with their brains and sweat. That is true to an extent, but several factors have propped up technology development over the decades. Silicon Valley's early success was largely subsidized by federal research money, including from the Defense Advanced Research Projects Agency (DARPA), that flowed to Stanford University and nurtured new technologies. The internet was built within the world of research with government funds. Global positioning system, or GPS, technology was developed by the U.S. military during the Cold War and became commercially available in the 1980s. (The government still owns and operates the technology.) Until rising inflation forced the Federal Reserve to change monetary policy in 2022, a long stretch of low interest rates left big investors such as pension funds searching for better yields and returns on their investments, which, along with the fear of missing out on the next big tech hit, led them to pour money into riskier assets. So much money ended up flowing into venture capital funds that they were able to support loss-making start-ups for longer, allowing fledgling companies a bigger shot at success. What's more, many of the start-up hits of the past two decades or so have come from repurposing traditional businesses for mobile platforms, which has been made far easier by the falling costs of computing technologies—by now an almost formulaic rather than inventive approach.

A fourth myth is that start-up founders are out to change the world, reshaping and reimagining our lives to benefit all of humanity. And they largely have. In a poor country like Bangladesh, a mobile phone in every hand has meant its citizens can get direct deposits from the government without relying on an inadequate banking system. Communications technology allows seamless video calling between continents, via apps like FaceTime and WhatsApp, and enabled world-shaping meetings to be conducted entirely online during the pandemic. Cloud computing lets small businesses rent computing power and storage without having to invest in their own data centers.

Elizabeth Spiers, a media entrepreneur and opinion columnist who was the founding editor of the gossip blog Gawker, decried the myth, though, calling out founders and venture capitalists on their hypocrisy when they say they are out to change the world. "They're out to change the world, but they're not out to make money," Spiers said, rolling her eyes during a video interview. Wall Street speculators are hardly saints, but as Spiers pointed out, at least "hedge fund founders don't have that narrative about themselves, that we're out to change the world." But the myth, which we indulge, allows tech founders to claim credit for all the innovation and none of the downsides, Spiers said. We thus ignore the monopolistic practices of many of today's biggest tech companies and accept the giant toxic spew and total control of the ways in which we interact and inform ourselves—even if they have directly contributed to some of the worst tendencies in society today—as the price we pay for ease, convenience, and connection. As misinformation, disinformation, social polarization, and conspiracy theories threaten the very foundations of democracy, personal data and secrets about our lives and search habits get ever more monetized.

A fifth myth is that creativity and genius can manifest only through nonconformist behavior, outside of the drudgery of a job and unconstrained by the burden of formal clothing—and without government interference and regulation. At the same time, while entrepreneurs and venture capitalists want to be left alone in the name of innovation, they also want to be bailed out in the name of innovation. When Silicon Valley Bank failed in March 2023, a contingent of venture capitalists and other big investors who typically want the government to stay away called on that very same government to help save the start-up industry, arguing that fledgling companies are the heart and soul of the so-called innovation economy.

The myths around how start-up founders are created and how they should behave are so pervasive that they have created an opportunity for signaling and subterfuge, matched only by the greed of investors who substituted so-called pattern matching for due diligence. Elizabeth Holmes, the founder of Theranos, the blood-testing start-up that collapsed in 2018, contorted herself to fit the stereotype. Dozens of commentators have remarked that Holmes, a Stanford dropout who was sentenced to eleven years in prison in late 2022 for defrauding investors, adopted Steve Jobs's uniform of black turtlenecks and spoke in a deep voice to establish her authority. When Channing Robertson, a professor at Stanford University, met Holmes, he realized that he "could have just as well been looking into the eyes of a Steve Jobs or a Bill Gates."25

The nerd-philosopher-genius-king of the cryptocurrency industry before his nosedive into disgrace, Sam Bankman-Fried, the founder of the crypto exchange FTX, grew his empire rapidly on the back of $2 billion raised from investors; at its peak, the start-up was valued at $32 billion. He was as notable for the breathless rise of his crypto exchange as for his shock of unkempt hair and cargo shorts, prompting The New York Times to call him a "studiously disheveled billionaire." In fact, when a colleague told Bankman-Fried to clean up his appearance, he demurred, saying that it was all part of the image and would be helpful rather than harmful.26

Popular Culture Stereotypes the Nerd

In the 1980s, as the idea of using computer programming for commercial benefit gained steam, personal computers infiltrated people’s homes and offices. In 1982, Time put the PC on its annual “person of the year” cover—calling it “Machine of the Year.” University computer departments began filling up; in 1965, there were 64 bachelor’s degrees awarded in the field. By 1985, there were nearly 42,000 degrees.27 Although far more students graduated with business and engineering degrees during the same time frame, the enthusiasm around technology was partly a result of its newness and the swift rise to fame and fortune of such a young group of people with similar characteristics.

That's when Hollywood's interest in the nerd began. Nerds were portrayed as misfits who used their skills to outwit the jocks and the jerks, winning in the process social and financial capital—and, of course, the girl. A host of cult teen movies from that decade, including Sixteen Candles, WarGames, Weird Science, Revenge of the Nerds, and Can't Buy Me Love (reprised in the 2003 film Love Don't Cost a Thing), are riffs on that core arc. Hammed up, popular culture's nerd is a composite character, picking up qualities much as a sedimentary rock picks up layers. The nerd is most often a heterosexual white male; a maladroit of formidable intellect who would rather look at the floor than make eye contact; uncomfortable in his own body; given to fits of rage and condescension; a dedicated fan of science fiction who plays video games to let off steam; and someone who can't be bothered to learn conventional modes of behavior or clothing. This character is also preoccupied with women, who are typically portrayed as objects of fascination and desire. Because dating women is an unattainable goal, the nerd sometimes uses his tech skills to spy on them. (A few real-life examples that provide fodder: According to Gates himself, as a teenager he would rearrange class schedules at Lakeside, the Seattle private school he attended, using his computing skills to make sure he sat in classes that had the most girls. Before cofounding Facebook, Zuckerberg—allegedly in a fit of pique after being jilted by a girl—created a site on Harvard's campus called FaceMash, where students could rate their classmates on how hot they were, according to Rolling Stone.) Women could only gain entry to the male world of the nerd if they were "cool." In the hit show Stranger Things, the character Max is accepted by geeky middle-school boys after she outscores them in a video game.

There is also an emphasis on gaming, regularly mentioned in the biographies of tech company founders as though it were a rite of passage. Fantasy role-playing games like RuneQuest and Dungeons & Dragons, creative video games like Minecraft, multiplayer games like World of Warcraft, and the abiding power of science fiction franchises like Star Trek are part of the picture—an echo of Brand's university students and computer hobbyists who spent hours playing Spacewar. Just to take one example: Fred Ehrsam, a founder of the cryptocurrency exchange Coinbase, spent thousands of hours growing up playing World of Warcraft, which is how he learned about digital currency early on, according to Forbes.

In the 1996 PBS documentary Triumph of the Nerds, a software programmer named Doug Muise tells the anchor: “Eating, bathing, having a girlfriend, having an active social life is incidental, it gets in the way of code time. Writing code is the primary force that drives our lives so anything that interrupts that is wasteful.”28 That force also seems to drive the Red Bull–drinking, Adderall-popping Zuckerberg character played by Jesse Eisenberg in the 2010 movie The Social Network.

In The Big Bang Theory, which ran on CBS from 2007 to 2019 and is considered one of the all-time great sitcom hits, the protagonist, Sheldon Cooper, is the archetypal nerd. He has no social skills or filter, no ability to recognize or engage in humor, and a brain that operates on multiple levels at once. He displays little empathy, humility, or tolerance of stupidity. As the ur nerd, Bill Gates appears in a cameo in Season 11 of the show. After one of the characters wells up at the sight of his childhood hero, Gates asks him if he needs a tissue. Another character unleashes a series of oh-my-gods in ascending pitches. In an interview about the show, Gates said: "It's fun to have a show where people are allowed to be a little nerdy, and a little bit smart, so I can relate to it, and I was thrilled when I got the chance."

In the HBO show Silicon Valley, a tech satire that ran from 2014 to 2019 and both parodied and idealized modern nerd culture, the protagonist, Richard Hendricks—portrayed as a skinny, neurotic, socially inept entrepreneur prone to vomiting when nervous—explains the historic opportunity for nerds: "For thousands of years, guys like us have gotten the shit kicked out of us," he says. "But for the first time we're living in an era where we can be in charge and build empires. We could be the Vikings of our day." In a later episode, Richard lays into a handsome entrepreneur seeking funding, telling him that tech is reserved for guys like himself: "You listen to me, you handsome muscle-bound Adonis. Tech is reserved for people like me, okay? The freaks, the weirdos, the misfits, the geeks, the dweebs, the dorks." Gates also made a cameo appearance in Silicon Valley. He said of the show: "Personally, I identify most with Richard, the founder of Pied Piper, who is a great programmer but has to learn some hard lessons about managing people."29 (Jobs, by comparison, was so mythic as to be irreducible to stereotype.)

Sitcoms might embrace stereotypes for laughs, but there are real social consequences. According to the scholar Jordynn Jack, nerds are often described as being "mildly autistic," a term casually tossed about as shorthand for a certain type of male who prefers technology to social interaction and whose brain is analytical and mathematical.30 Autism became closely linked to the fields of technology, science, and computing in the 1980s, and in that decade was most closely associated with Gates, Jack writes. The media took the term and ran with it, citing statistics about how the rates of autism are highest in Silicon Valley. When Musk disclosed that he has Asperger's syndrome while hosting Saturday Night Live in 2021, it only embedded the storyline further. "For now, it seems that the persuasiveness of the Extreme Male Brain or Silicon Valley theories lies more in how those theories fit with our notions of gender, geekiness, and the late twentieth-century workplace than in actual statistical patterns," according to Jack. This gendered view of autism—and its cousin Asperger's—that is upheld in popular culture as well as scientific research does a disservice to the understanding of a complex disability that also affects women, she argues. But in an interview, Jack noted the flip side: Pop culture mentions of autism spectrum disorders can bring attention and resources to a poorly understood condition.

Many researchers have found that the stereotypical nerd image and the portrayal of nerds in hypermasculine environments can deter adolescent girls and women from entering science, technology, engineering, and math, or STEM, fields. Sapna Cheryan, a psychologist at the University of Washington, and her colleagues have found that the bigger the sense of mismatch between a person's self-image and the cultural stereotype of how computer types are supposed to look and behave, the more likely that person—especially a woman—is to be put off the field. In a 2013 paper, Cheryan, who has studied the role that cultural stereotyping plays in attracting women to or repelling them from certain careers, highlighted the same stereotypes: an orientation toward technology, with strong interests in programming and little interest in people; solitariness; a singular focus with no outside interests; a lack of social skills; genius-level brilliance; a media-reinforced association between nerdiness and computers; and masculine interests like video games. "Taken together, the image of a computer scientist that emerges in the U.S. is one of a genius male computer hacker who spends a great deal of time alone on the computer, has an inadequate social life, and enjoys hobbies involving science fiction," Cheryan and her coauthors write.32

Nerds Turned Bros

Every July, as business moguls fly in from all over the world to Sun Valley for their annual conference, reporters and photographers jostle to identify the attendees. With little chance of covering the actual events or even peeking at the agenda, since the media is not invited, reporters have made something of a tradition over the years of chronicling billionaire fashion. Reid Hoffman, the founder of LinkedIn, in a track jacket. Tim Cook, the chief executive of Apple, in a polo shirt. Facebook's Zuckerberg in one of his trademark gray tees. Barry Diller in a Dior aloha shirt. Michael Bloomberg in a lemon-yellow plaid button-up. Were those Allbirds on Sheryl Sandberg? Blue-tinted lenses on Andreas Halvorsen, a hedge fund manager. Stacey Bendet of the fashion house Alice & Olivia in a floral dress. Buffett in a lime-green shirt patterned with the Geico mascot, a gecko.

It was hardly surprising, then, that Sun Valley was where, in 2017, Bezos was noticed for his remarkable physical transformation. Gone was the nerdy, baby-faced founder of an online bookstore. In his place was a buff jock in aviator sunglasses, sporting a sleeveless puffer vest over a fitted T-shirt that showed off his bulging biceps, an appearance that gave rise to the meme "Swole Bezos." The media moment soon passed, but the image had staying power because the transformation from wimpy nerd to uber-masculine "bro" captured perfectly the potency of today's tech heroes. Others too have shed their nerd physicality. Zuckerberg took up jujitsu and competed in his first tournament in the spring of 2023, even winning some medals. He and Musk also challenged each other to an actual cage fight, a duel that appeared to start as a joke on Twitter but quickly took on an air of seriousness, revealing the bro newly emerged from his nerd shell. (The two eventually called it off.)

Although Amazon and Microsoft are based in the Seattle area, Silicon Valley is the home base of tech exceptionalism. Considered to be the center of innovation and entrepreneurship globally, it’s a place where chaos and failure are welcomed, and wild risk-taking is encouraged. People move there to get funding for what they hope are world-changing ideas, as well as to find talent and build networks. The Valley operates on a seemingly straightforward principle: a good idea will find money and money will find a good idea. For that simple exchange to happen, the thinking goes, the place must operate by its own rules, often beyond the bounds of convention and free of government interference. Seen this way, Silicon Valley is a hermetically sealed libertarian utopia that fuses the countercultural values inherited from the original nerds and computer bums of the 1970s with hard-nosed, no-holds-barred capitalism. “Silicon Valley is a mindset, not a location,” LinkedIn’s Hoffman once told the Financial Times newspaper.33 Midsize cities around America have taken the message; there are dozens of mayors pushing to revamp disused downtown business districts into slick mini–Silicon Valleys, wooing young start-ups with lower rents and tech giants with tax breaks.

One reason the Valley can be exclusionary is that it is so wedded to "disruption"—the idea that success comes to those who move fast and break things, and shake up business models, often without regard for consequences. "They have big ideas, but often they have very few ethics," said Rosanne Siino, the former Netscape executive who now advises start-ups and teaches a class at Stanford about organizational dynamics. She described the Valley as a place where there is no conversation about ethics or any consideration of it. "There were never enough safeguards in place, and an unregulated market allowed people to do whatever they wanted." Some forethought could have anticipated the downsides of social media, and Siino worried that the rush to build out products tied to artificial intelligence without thinking about the unintended consequences could have similar implications. "You'd think that with venture capitalists, there would be some sort of adult supervision in the room, but it's none of the adult supervision that we, societally, would benefit from the most. They too are concerned mainly with money, most of them don't care who they fund as long as they expect a certain return on investment," Siino said. But she pointed out that the online nature of our world and social media allows the misbehavior and misogyny of tech founders to surface. In earlier days, none of the bad behavior could be tracked.

In November 2023, a tussle at OpenAI, the leading artificial intelligence start-up, illustrated the fundamental tension between building a "change-the-world" technology with caution and guardrails and rushing to commercialize it with reckless ambition. The board members of the nonprofit that governed OpenAI reportedly fired the company's chief executive, Sam Altman, out of concern that Silicon Valley's move-fast-and-break-things ethos, applied to AI, could have enormous implications for humanity. But Altman, as Gates had with software, had spotted the endless market for AI applications and pushed hard to build a business around it quickly. Within days, those directors had been ousted and Altman was back at the company, even more determined to use artificial intelligence for profit.

A second reason the Valley can be exclusionary is that it underplays the role of connections when it sells the idea that merit alone finds, and makes, money. Alliances are often formed among people who went to the same schools, shared the same experiences, and see the world the same way. Hoffman is among those who have weighed in repeatedly on the power of networks. He introduced Thiel, one of the founders of PayPal, to Zuckerberg when the Facebook cofounder was looking for funding for his social media start-up. Hoffman and Thiel met as sophomores at Stanford and are famously part of the "PayPal mafia" that includes Musk. Thiel made more than a billion dollars off his $500,000 investment in Facebook.

Underlying all this is a quality of "maleness," a nebulous concept but a helpful one because it captures the structural and institutional setup of the technology industry. Unlike masculinity, which traditionally describes physical and social traits such as strength, vigor, muscularity, and assertiveness, maleness evokes an atmosphere of camaraderie, of brothers-in-arms, where the scent of sweat and the fuel of testosterone propel people, where bonding happens over coding, belching, and flatulence, and where coders sleep under their desks.

If the stereotypical portrayals of nerds that Cheryan writes about already deter many women from thinking about careers in technology, women entrepreneurs, venture capitalists, and technologists who do enter the Silicon Valley ecosystem often find themselves in an unwelcoming world. That makes it an exclusionary space for many women, men, and nonbinary people who don’t see themselves thriving in this environment of maleness. It can also be intimidating for new immigrants, many of whom don’t assert themselves for fear of being kicked out or not fitting in.

Cheryan became interested in the area when, as a first-year graduate student at Stanford, she decided to apply for internships at technology companies. She remembers going to the office of one start-up and walking past a conference room named after a Star Trek ship. "I remember thinking, I don't know if I'd fit in," she said. Instead, she took an internship at Adobe, based purely on what the software company's offices looked like: an airy space, a café, and a gym. It gave her a "sense of belonging."

The statistics bear this out. More than 90 percent of venture capital funding goes to men. Women may be earning more than before, and in more high-profile jobs, but they own less equity in companies, which means they capture less of the wealth created when a company exits through an initial public offering or a sale. Female representation in technology and related fields remains lower than in other sectors, including finance and healthcare, especially at the entry level, according to a 2021 report.34

In such an environment, even senior women in tech have struggled to make it work in Silicon Valley. Ellen Pao, a former partner at Kleiner Perkins who sued the storied venture capital firm in 2012 for discrimination and bias, found herself a pariah among venture capitalists after her lawsuit, which she lost. In it, she described a culture of all-male networking, including a ski trip to Vail from which she was excluded.35 Susan Fowler, an engineer at Uber, wrote a detailed blog post in 2017 about a hostile work environment at the ride-hailing company, describing a culture of sexism, sexual harassment, and retaliation. Recent efforts to promote more female founders and to invest in companies shut out of the typical Valley networks hold promise, but the tech world's underlying culture of maleness remains as entrenched as ever.

At the same time, the stories of early women entrepreneurs like Ann Winblad, who built an accounting software company in 1976 with borrowed money, get lost in the bigger narrative. Winblad, long a successful venture capitalist, came to the wider world's attention afresh in 2021, and then only for a tidbit about her past relationship with Gates, who had just gotten divorced. Many women who have succeeded in the Valley, mostly as senior executives or venture capitalists, have learned to "play with the boys," in the words of one. Siino said she had risen through Silicon Valley's mostly male ranks by "not giving a fuck."

For all their talk of changing the world, tech founders' success in getting the world to operate on their terms has not meant changing those terms for the better. Rather, they have simply extracted what they think they deserve. Some of the biggest tech entrepreneurs seem to be making up for a lost youth. Musk, who has said he was bullied in school, is not above behaving like a schoolyard bully himself, alternately taunting and picking fights, yet somehow desperate for approval. He has called his constant need to tweet a delayed adolescence.36 But nerds are also displaying the high-handed behavior we often expect from those we label moguls, kingpins, and celebrities. Dominance and success have conferred on nerds a certain cachet. Nerdcore is fashionable. Their newfound wealth is a magnet. They have access to Wall Street chief executives and top politicians and rub shoulders with Hollywood and sports superstars.

Even their sexual exploits and unusual romantic engagements are news fodder. In Brotopia, the journalist and television host Emily Chang describes a Valley drug-and-sex scene that she suggests is its own kind of disruption of society.37 "Their behavior at these high-end parties," she writes, "is an extension of the progressiveness and open-mindedness—the audacity, if you will—that make founders think they can change the world." These parties might not have the psychological verve of Stanley Kubrick's 1999 erotic thriller Eyes Wide Shut, but they do evoke a place where wealth can buy the most extreme sexual fantasies, although unlike in the movie, the masks are apparently off in Silicon Valley. In 2014, Andreessen mapped the evolution of computer technology from the 1950s to the 2010s onto the changing image of nerds in a series of Twitter posts titled "As the Nerds Turn." The 1950s, '60s, '70s, '80s, and early '90s were decades when all of computer technology was equated with "nerds," he wrote. By 2014, however, the sentiment had shifted to: "Those nerds are completely out of ideas again, and now they're having sex too!"38

One longtime observer of the Silicon Valley scene, a partner at a venture capital firm, said that nerds have become more confident, even arrogant, because they have been proved right. "The hard part is that when you suddenly have all this wealth and all this attention and are at the center of things, you enjoy the trappings of it, you get invited to the Vanity Fair Oscar party, you get the girl," the person said. "Nobody even looked at you sideways, and now you're the center of attention, you do get in." The partner pointed out that society marvels at the genius that produces game-changing products and services, but, as with artists and musicians, that genius often comes at a price. "The fallacy is in expecting [nerds] to behave normally in every other way except their genius."